
platform framework #2193

Merged
merged 51 commits into master on Mar 29, 2023

Conversation

@lxning lxning commented Mar 22, 2023

Description

Please read our CONTRIBUTING.md prior to creating your first pull request.

Please include a summary of the feature or issue being fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.

It covers the following subtasks in the internal design:

  • model archiver: support a model config YAML file by adding the parameter -c (a hedged sketch of such a config follows this list).
  • torchrun integration
  • GPU assignment: allow users to configure device IDs for a model
  • communication between frontend and backend
  • socket assignment
  • monitoring and logging: support RPC job logs and metrics
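
For illustration, here is a minimal sketch of what such a model config YAML could contain, parsed with PyYAML. The key names (deviceIds, parallelLevel, parallelType, and the worker counts) are assumptions loosely based on the ModelConfig fields discussed in the review below, not the exact schema shipped by this PR.

import yaml  # PyYAML

# Hypothetical model-config.yaml content; key names are illustrative only.
MODEL_CONFIG_YAML = """
minWorkers: 1
maxWorkers: 1
deviceIds: [0, 1]    # GPUs assigned to this model
parallelLevel: 2     # number of processes launched via torchrun
parallelType: pp     # parallelism type, e.g. pipeline parallel
"""

config = yaml.safe_load(MODEL_CONFIG_YAML)
print(config["deviceIds"], config["parallelLevel"], config["parallelType"])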

Fixes #2192

Type of change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • New feature (non-breaking change which adds functionality)
  • This change requires a documentation update

Feature/Issue validation/testing

Please describe the Unit or Integration tests that you ran to verify your changes and relevant result summary. Provide instructions so it can be reproduced.
Please also list any relevant details for your test configuration.

  • Test A
    Logs for Test A

  • Test B
    Logs for Test B
    reg.txt

Checklist:

  • Did you have fun?
  • Have you added tests that prove your fix is effective or that this feature works?
  • Has code been commented, particularly in hard-to-understand areas?
  • Have you made corresponding changes to the documentation?

@@ -157,6 +157,7 @@ def create_mar_file(work_dir, session_mocker, jit_file_path, model_archiver):
extra_files=os.path.join(EXAMPLE_ROOT_DIR, "index_to_name.json"),
export_path=work_dir,
requirements_file=None,
config_file=None,
Member

What's the difference between config_file and yaml_config_file that shows up in the torchrec test?

Collaborator Author

Here, config_file is the model config YAML file. I don't see yaml_config_file in the TS code.
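
For context, a hedged sketch of how the new -c option could be passed on the command line from Python. The standard torch-model-archiver flags (--model-name, --version, --serialized-file, --handler, --export-path) already exist; only -c is introduced by this PR, and the file names below are placeholders.

import subprocess

# Illustrative invocation only; model.pt, my_handler.py and model-config.yaml
# are placeholder file names.
subprocess.run(
    [
        "torch-model-archiver",
        "--model-name", "my_model",
        "--version", "1.0",
        "--serialized-file", "model.pt",
        "--handler", "my_handler.py",
        "--export-path", "model_store",
        "-c", "model-config.yaml",  # new in this PR: model config YAML file
    ],
    check=True,
)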

@@ -22,7 +21,10 @@
DEBUG = False
BENCHMARK = os.getenv("TS_BENCHMARK")
BENCHMARK = BENCHMARK in ["True", "true", "TRUE"]

LOCAL_RANK = int(os.getenv('LOCAL_RANK', 0))
Member

By default, I don't believe these environment variables will be set in a single-GPU deployment.

Collaborator Author

Here, this value is used to calculate the RPC job's port. When torchrun is not running, the default of 0 is returned, so a normal backend worker still gets the correct port.
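
A rough sketch of that mechanism (the exact port arithmetic in the backend is an assumption here): each torchrun-launched process derives its own port from a base port plus LOCAL_RANK, and when torchrun is not used LOCAL_RANK falls back to 0, so a normal worker keeps the base port.

import os

# Sketch only; the real computation lives in the TorchServe backend.
LOCAL_RANK = int(os.getenv("LOCAL_RANK", 0))  # 0 when torchrun is not running

def worker_port(base_port: int) -> int:
    # Each local rank listens on its own port; rank 0 keeps base_port.
    return base_port + LOCAL_RANK

print(worker_port(9000))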

private int parallelLevel = 1;
private ParallelType parallelType = ParallelType.NONE;

public static ModelConfig build(Map<String, Object> yamlMap) {
Member

This seems like a schema. Will this reduce flexibility for users trying to add their own configs, or are these the configs from the YAML that matter to us? Curious why the frontend needs to be aware of these in the first place.

Collaborator Author

ModelConfig is the schema for the frontend parameters, which are completely controlled by TS. Users are only interested in the handler-side parameters, so they are only allowed to extend parameters on the backend.
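
To make that split concrete, a hypothetical sketch: the frontend consumes only the keys it knows from the ModelConfig schema, while user-defined parameters live in a separate section (here called handler, an assumed name) that is forwarded to the backend untouched. None of the names below are the PR's actual code.

import yaml

# Illustrative subset of keys the frontend schema (ModelConfig) understands.
FRONTEND_KEYS = {"minWorkers", "maxWorkers", "deviceIds", "parallelLevel", "parallelType"}

yaml_map = yaml.safe_load("""
minWorkers: 1
parallelLevel: 2
handler:              # user-defined, backend-only parameters (assumed section name)
  temperature: 0.7
  top_k: 5
""")

frontend_config = {k: v for k, v in yaml_map.items() if k in FRONTEND_KEYS}
handler_config = yaml_map.get("handler", {})  # passed through to the handler as-is
print(frontend_config)
print(handler_config)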


codecov bot commented Mar 24, 2023

Codecov Report

Merging #2193 (b2eb2fb) into master (6b5d8c8) will decrease coverage by 0.13%.
The diff coverage is 62.16%.

❗ Current head b2eb2fb differs from pull request most recent head 333c189. Consider uploading reports for the commit 333c189 to get more accurate results

@@            Coverage Diff             @@
##           master    #2193      +/-   ##
==========================================
- Coverage   71.57%   71.44%   -0.13%     
==========================================
  Files          73       73              
  Lines        3307     3338      +31     
  Branches       57       57              
==========================================
+ Hits         2367     2385      +18     
- Misses        937      950      +13     
  Partials        3        3              
Impacted Files Coverage Δ
...l-archiver/model_archiver/model_packaging_utils.py 60.13% <ø> (ø)
ts/utils/util.py 71.42% <33.33%> (-3.22%) ⬇️
ts/service.py 71.26% <47.05%> (-6.21%) ⬇️
ts/model_service_worker.py 67.64% <90.90%> (+1.75%) ⬆️
model-archiver/model_archiver/model_packaging.py 90.32% <100.00%> (+0.32%) ⬆️
ts/context.py 67.53% <100.00%> (+0.42%) ⬆️
ts/tests/unit_tests/test_model_service_worker.py 99.13% <100.00%> (ø)


@msaroufim msaroufim self-requested a review March 24, 2023 16:21
@lxning lxning merged commit d662c26 into master Mar 29, 2023
Labels
enhancement New feature or request